Microsoft boss troubled by rise in reports of "AI psychosis"
Microsoft's head of artificial intelligence says he is alarmed by the rising number of cases of a phenomenon dubbed "AI psychosis". It's a non-clinical term used to describe people who rely so heavily on chatbots such as Copilot or ChatGPT that they become convinced something imaginary is real. Zoe Kleinman reports.
Hugh, from Scotland, says he became convinced that he was about to become a multi-millionaire after turning to an AI chatbot for help when he lost his job. It began by giving him practical advice, but ended up telling him that a book and a movie about his experience would make him more than £5 million.
"The more information I gave the chatbot, the more it would say, 'Oh, this treatment's terrible. You should really be getting more than this.' It would ask me for more information, I would feed it more, and the number would just get higher and higher. It would go from 10 grand into the millions, basically."
>> He already had mental health problems and ended up having a breakdown. He says taking medication made him realize the money wasn't real. He doesn't blame the technology; he says it signposted Citizens Advice, but he ignored it because the chatbot was so convincing.
AI psychosis is a non-clinical term used to describe people who use chatbots such as ChatGPT, Grok and Claude and start to lose touch with reality.
I've had messages from people convinced that the tech has fallen in love with them, that they've unlocked a secret human inside it, or even that it's deliberately trying to harm them. A survey of 2,000 UK adults carried out for Bangor University's Emotional AI Lab
found that 57% thought it was strongly
inappropriate for the tech to identify
as a real person if asked. But 49%
thought the use of voice was appropriate
to make chatbots sound more engaging.
20% thought children under the age of 18
shouldn't use AI at all.
>> We as professionals and doctors may have to start asking people, when we see them in clinic, how much AI they're using and how it's affecting their lives, just as we would for smoking and alcohol. We already know what ultra-processed foods can do to the body, and I think with this ultra-processed information we're going to get an avalanche of ultra-processed minds that we need to deal with.
The advice is to double-check everything a chatbot tells you and not to stop talking to real people. Finally, if you feel you're using AI to make all your decisions for you, think about taking a step back. Zoe Kleinman, BBC News.